# Autoregressive text generation
## Smt Grandstaff
- **Author:** antoniorv6 · **License:** MIT
- **Task:** Image-to-Text
- **Downloads:** 136 · **Likes:** 4

An SMT model fine-tuned on the Camera GrandStaff piano sheet dataset for transcribing piano sheet-music images.
## Llama 7b Hf
- **Author:** luodian · **License:** Other
- **Task:** Large Language Model
- **Tags:** Transformers, multilingual
- **Downloads:** 1,014 · **Likes:** 36

A 7B-parameter open-source large language model developed by Meta AI, built on the Transformer architecture, with multilingual support including English and Chinese.
## OCR DocVQA Donut
- **Author:** jinhybr · **License:** MIT
- **Task:** Image-to-Text
- **Tags:** Transformers
- **Downloads:** 240 · **Likes:** 13

Donut is an OCR-free document-understanding Transformer that pairs a visual encoder with a text decoder for document visual question answering.
## Gpt2 Small
- **Author:** ComCom · **License:** MIT
- **Task:** Large Language Model
- **Tags:** Transformers, English
- **Downloads:** 1,032 · **Likes:** 3

GPT-2 is an autoregressive language model based on the Transformer architecture. Pre-trained on a large English corpus via self-supervised learning, it excels at text generation.
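GPT-2 (like CPM below) generates text autoregressively: at each step it predicts a distribution over the next token given the prefix, appends a chosen token, and repeats. The sketch below illustrates that loop with greedy decoding over a hand-made toy bigram table; the `BIGRAM` data and `generate` helper are illustrative stand-ins, not part of any model listed here.

```python
# Toy next-token "model": for each previous token, a probability
# distribution over possible next tokens. A real LM conditions on the
# whole prefix; a bigram table keeps the example self-contained.
BIGRAM = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.3, "<e>": 0.2},
    "a":   {"dog": 0.8, "<e>": 0.2},
    "cat": {"sat": 0.7, "<e>": 0.3},
    "dog": {"ran": 0.6, "<e>": 0.4},
    "sat": {"<e>": 1.0},
    "ran": {"<e>": 1.0},
}

def generate(model, max_len=10):
    """Greedy autoregressive decoding: always take the most likely next token."""
    tokens = ["<s>"]
    for _ in range(max_len):
        dist = model.get(tokens[-1])
        if dist is None:
            break
        nxt = max(dist, key=dist.get)  # greedy choice; sampling is the alternative
        if nxt == "<e>":               # end-of-sequence token stops generation
            break
        tokens.append(nxt)
    return tokens[1:]                  # drop the start-of-sequence marker
```

With this table, greedy decoding always follows the highest-probability edge (`<s>` → `the` → `cat` → `sat`); sampling from each distribution instead would yield varied outputs, which is how real deployments trade determinism for diversity.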
## CPM Generate
- **Author:** TsinghuaAI · **License:** MIT
- **Task:** Large Language Model
- **Tags:** Transformers, Chinese
- **Downloads:** 622 · **Likes:** 42

CPM is a Transformer-based, 2.6-billion-parameter Chinese pre-trained language model, trained on 100 GB of Chinese text and supporting a range of natural language processing tasks.
## Nb Gpt J 6B
- **Author:** NbAiLab · **License:** Apache-2.0
- **Task:** Large Language Model
- **Tags:** Transformers, Other
- **Downloads:** 479 · **Likes:** 20

A Norwegian fine-tune of GPT-J 6B, a 6-billion-parameter Transformer model.